Recurrent Network Models of Sequence Generation and Memory
Authors
Kanaka Rajan, Christopher D. Harvey, David W. Tank

Abstract
Sequential activation of neurons is a common feature of network activity during a variety of behaviors, including working memory and decision making. Previous network models for sequences and memory emphasized specialized architectures in which a principled mechanism is pre-wired into their connectivity. Here we demonstrate that, starting from random connectivity and modifying a small fraction ...
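To make the idea concrete, the sketch below sets up a rate-based RNN with fixed random connectivity and adjusts only a small, randomly chosen subset of recurrent weights so that unit activity follows a sequence of Gaussian-bump targets. The learning rule (a plain delta rule), the 10% plastic fraction, and all parameter values are illustrative assumptions, not the training procedure used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N, T = 200, 300                 # number of units, number of time steps
dt, tau = 1.0, 10.0             # integration step and unit time constant
g = 1.5                         # gain of the initial random connectivity
J = g * rng.standard_normal((N, N)) / np.sqrt(N)   # random recurrent weights

# Only a small, randomly chosen fraction of synapses is allowed to change
# (10% here is an illustrative choice).
plastic = rng.random((N, N)) < 0.10

# Target: each unit is assigned a Gaussian activity bump at its own preferred
# time, so the population target is a stereotyped sequence sweeping across units.
centers = np.linspace(0.0, T, N)
t_axis = np.arange(T)
targets = np.exp(-((t_axis[None, :] - centers[:, None]) ** 2) / (2.0 * 20.0 ** 2))

eta = 0.05                      # learning rate (assumed)
for epoch in range(30):
    x = 0.1 * rng.standard_normal(N)        # network state
    for t in range(T):
        r = np.tanh(x)                      # firing rates
        x = x + dt / tau * (-x + J @ r)     # leaky rate dynamics
        err = np.tanh(x) - targets[:, t]    # mismatch with the target sequence
        # Delta-rule update applied only to the plastic subset of synapses.
        J -= eta * plastic * np.outer(err, r) / N

# After training, the rates np.tanh(x) traced over a trial should sweep through
# the units in order, i.e. the network generates a sequence from mostly random wiring.
```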
Similar resources

Memory Architectures in Recurrent Neural Network Language Models
We compare and analyze sequential, random access, and stack memory architectures for recurrent neural network language models. Our experiments on the Penn Treebank and Wikitext-2 datasets show that stack-based memory architectures consistently achieve the best performance in terms of held out perplexity. We also propose a generalization to existing continuous stack models (Joulin & Mikolov, 201...
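As a rough illustration of the continuous-stack idea referenced above, the sketch below performs one differentiable stack update by blending push, pop, and no-op with continuous action weights; the exact gating and memory layout in the cited models differ, so treat this as a simplified assumption rather than their formulation.

```python
import numpy as np

def soft_stack_step(stack, new_top, a_push, a_pop, a_noop):
    """One differentiable stack update: the next stack is a convex combination
    of pushing a new element, popping the top, and leaving the stack unchanged."""
    pushed = np.vstack([new_top[None, :], stack[:-1]])         # everything shifts down
    popped = np.vstack([stack[1:], np.zeros_like(stack[:1])])  # everything shifts up
    return a_push * pushed + a_pop * popped + a_noop * stack

stack = np.zeros((5, 3))                          # depth 5, element width 3
stack = soft_stack_step(stack, np.ones(3), a_push=0.9, a_pop=0.05, a_noop=0.05)
print(stack[0])                                   # top is mostly the pushed vector
```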
Realtime control of sequence generation with character based Long Short Term Memory Recurrent Neural Networks
Recurrent Neural Networks (RNNs) — particularly Long Short Term Memory (LSTM) RNNs — are a popular and very successful model for generating sequences. However, most LSTM based sequence generation techniques are currently not interactive and do not allow continuous control of the sequence generation, let alone in a gestural or expressive manner. This research investigates methods of realtime con...
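One common way to expose continuous, live control over character-level generation is to treat the sampling temperature as a parameter that can be changed while the model runs. The sketch below uses a hypothetical stand-in next_char_logits in place of a trained LSTM and only illustrates such a control knob; it is not the method investigated in this work.

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB = list("abcdefgh ")

def next_char_logits(history):
    """Hypothetical stand-in for a trained character LSTM's output layer."""
    return rng.standard_normal(len(VOCAB))

def sample_char(history, temperature):
    """Temperature-scaled sampling: low temperature gives conservative output,
    high temperature gives more exploratory output; the knob can be moved live."""
    logits = next_char_logits(history) / max(temperature, 1e-6)
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return VOCAB[rng.choice(len(VOCAB), p=probs)]

history = "hello "
for step in range(20):
    temperature = 0.5 + 0.4 * np.sin(step / 5.0)   # pretend a user moves a slider
    history += sample_char(history, temperature)
print(history)
```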
Real-time interactive sequence generation and control with Recurrent Neural Network ensembles
Recurrent Neural Networks (RNN), particularly Long Short Term Memory (LSTM) RNNs, are a popular and very successful method for learning and generating sequences. However, current generative RNN techniques do not allow real-time interactive control of the sequence generation process, thus aren’t well suited for live creative expression. We propose a method of real-time continuous control and ‘st...
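One way to picture this kind of real-time steering is to blend the next-step distributions of several sequence models with user-controlled mixture weights. The interface below (each model returning a probability vector over a shared vocabulary) is an assumption made for illustration, not the mechanism proposed in the paper.

```python
import numpy as np

def blended_distribution(models, history, weights):
    """Mix the next-step distributions of an ensemble of sequence models.
    Moving `weights` in real time morphs the output between the models' styles."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    dists = np.stack([m(history) for m in models])   # each model returns a prob. vector
    return w @ dists

# Two toy "models" with fixed, different preferences over a 4-symbol vocabulary.
model_a = lambda history: np.array([0.7, 0.1, 0.1, 0.1])
model_b = lambda history: np.array([0.1, 0.1, 0.1, 0.7])
print(blended_distribution([model_a, model_b], "abc", weights=[0.8, 0.2]))
```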
PRNN: Recurrent Neural Network with Persistent Memory
Although Recurrent Neural Networks (RNNs) have been a powerful tool for modeling sequential data, their performance is inadequate when processing sequences with multiple patterns. In this paper, we address this challenge by introducing an external memory and constructing a novel persistent memory augmented RNN (termed PRNN) model. The PRNN model captures the principal patterns in training sequences...
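To illustrate what an external, persistent memory can add to an RNN, here is a minimal attention-style read over key-value slots; the slot layout and softmax read rule are generic assumptions for illustration and not the specific PRNN mechanism.

```python
import numpy as np

def memory_read(keys, values, query):
    """Soft key-value lookup: a softmax over key/query similarities yields
    attention weights, and the read vector is the weighted sum of the values.
    The read vector can then be combined with the RNN's hidden state."""
    scores = keys @ query                      # similarity per slot, shape (slots,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ values                    # read vector, shape (value_dim,)

rng = np.random.default_rng(0)
keys = rng.standard_normal((8, 16))            # 8 memory slots, key width 16
values = rng.standard_normal((8, 32))          # stored patterns, width 32
print(memory_read(keys, values, query=rng.standard_normal(16)).shape)   # (32,)
```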
Journal
Journal title: Neuron
Year: 2016
ISSN: 0896-6273
DOI: 10.1016/j.neuron.2016.02.009